A Sparse Decomposition of Low Rank Symmetric Positive Semidefinite Matrices | Multiscale Modeling & Simulation | Vol. 15, No. 1 | Society for Industrial and Applied Mathematics

Authors

  • Thomas Y. Hou
  • Pengchuan Zhang
Abstract

Abstract. Suppose that $A \in \mathbb{R}^{N \times N}$ is symmetric positive semidefinite with rank $K \le N$. Our goal is to decompose $A$ into $K$ rank-one matrices $\sum_{k=1}^{K} g_k g_k^T$ where the modes $\{g_k\}_{k=1}^{K}$ are required to be as sparse as possible. In contrast to eigendecomposition, these sparse modes are not required to be orthogonal. Such a problem arises in random field parametrization where $A$ is the covariance function and is intractable to solve in general. In this paper, we partition the indices from 1 to $N$ into several patches and propose to quantify the sparseness of a vector by the number of patches on which it is nonzero, which is called patchwise sparseness. Our aim is to find the decomposition which minimizes the total patchwise sparseness of the decomposed modes. We propose a domain-decomposition type method, called intrinsic sparse mode decomposition (ISMD), which follows the “local-modes-construction + patching-up” procedure. The key step in the ISMD is to construct local pieces of the intrinsic sparse modes by a joint diagonalization problem. Thereafter, a pivoted Cholesky decomposition is utilized to glue these local pieces together. Optimal sparse decomposition, consistency with different domain decompositions, and robustness to small perturbation are proved under the so-called regular-sparse assumption (see Definition 1.2). We provide simulation results to show the efficiency and robustness of the ISMD. We also compare the ISMD to other existing methods, e.g., eigendecomposition, pivoted Cholesky decomposition, and convex relaxation of sparse principal component analysis [R. Lai, J. Lu, and S. Osher, Comm. Math. Sci., to appear; V. Q. Vu, J. Cho, J. Lei, and K. Rohe, Fantope projection and selection: A near-optimal convex relaxation of sparse PCA, in Advances in Neural Information Processing Systems 26, 2013, pp. 2670–2678].
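Two of the ingredients mentioned above, patchwise sparseness and a pivoted Cholesky decomposition of a low-rank PSD matrix, can be illustrated in a few lines of NumPy. The sketch below uses hypothetical helper names (`pivoted_cholesky`, `patchwise_sparseness`) and is not the paper's implementation: it builds a rank-2 PSD matrix from two patch-local modes, recovers a rank-one decomposition $A = \sum_k g_k g_k^T$ by greedy pivoted Cholesky, and counts the patchwise sparseness of each recovered mode with respect to a two-patch partition. The joint-diagonalization and patching-up steps of the ISMD itself are not shown.

```python
import numpy as np

def pivoted_cholesky(A, tol=1e-10):
    """Greedy pivoted Cholesky of a symmetric PSD matrix A.

    Returns vectors g_1, ..., g_K with A ~= sum_k g_k g_k^T, where K is
    the numerical rank of A.  (Hypothetical helper, not the paper's code.)
    """
    R = np.array(A, dtype=float)          # residual matrix
    modes = []
    for _ in range(A.shape[0]):
        d = np.diag(R)
        i = int(np.argmax(d))
        if d[i] <= tol:                   # numerical rank reached
            break
        g = R[:, i] / np.sqrt(d[i])       # rank-one factor from the pivot column
        modes.append(g)
        R = R - np.outer(g, g)            # deflate the residual
    return modes

def patchwise_sparseness(g, patches, tol=1e-10):
    """Number of patches on which the vector g is (numerically) nonzero."""
    return sum(np.linalg.norm(g[p]) > tol for p in patches)

# Toy example: N = 8, a partition into two patches, and a rank-2 PSD matrix
# built from two modes that are each supported on a single patch.
patches = [np.arange(0, 4), np.arange(4, 8)]
g1 = np.array([1., 2., 0., 1., 0., 0., 0., 0.])
g2 = np.array([0., 0., 0., 0., 1., 0., 2., 1.])
A = np.outer(g1, g1) + np.outer(g2, g2)

modes = pivoted_cholesky(A)
print(len(modes))                                           # 2 = rank(A)
print([patchwise_sparseness(g, patches) for g in modes])    # [1, 1]
print(np.allclose(A, sum(np.outer(g, g) for g in modes)))   # True
```

In this constructed example the pivot columns happen to be supported on single patches, so each recovered mode has patchwise sparseness 1; for a general $A$, greedy pivoted Cholesky modes need not minimize the total patchwise sparseness, which is the gap the ISMD is designed to close.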

Similar articles

A Sparse Decomposition of Low Rank Symmetric Positive Semidefinite Matrices

Suppose that $A \in \mathbb{R}^{N \times N}$ is symmetric positive semidefinite with rank $K \le N$. Our goal is to decompose $A$ into $K$ rank-one matrices $\sum_{k=1}^{K} g_k g_k^T$ where the modes $\{g_k\}_{k=1}^{K}$ are required to be as sparse as possible. In contrast to eigendecomposition, these sparse modes are not required to be orthogonal. Such a problem arises in random field parametrization where $A$ is the covariance function and is ...

A Recursive Skeletonization Factorization Based on Strong Admissibility (Multiscale Modeling & Simulation, Vol. 15, No. 2)

We introduce the strong recursive skeletonization factorization (RS-S), a new approximate matrix factorization based on recursive skeletonization for solving discretizations of linear integral equations associated with elliptic partial differential equations in two and three dimensions (and other matrices with similar hierarchical rank structure). Unlike previous skeletonization-based factorizat...

Singular value inequalities for positive semidefinite matrices

In this note, we obtain some singular value inequalities for positive semidefinite matrices by using a block matrix technique. Our results are similar to some inequalities shown by Bhatia and Kittaneh in [Linear Algebra Appl. 308 (2000) 203-211] and [Linear Algebra Appl. 428 (2008) 2177-2191].

Robust Approximate Cholesky Factorization of Rank-Structured Symmetric Positive Definite Matrices

Given a symmetric positive definite matrix $A$, we compute a structured approximate Cholesky factorization $A \approx R^T R$ up to any desired accuracy, where $R$ is an upper triangular hierarchically semiseparable (HSS) matrix. The factorization is stable, robust, and efficient. The method compresses off-diagonal blocks with rank-revealing orthogonal decompositions. In the meantime, positive semidefinite te...
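As a point of reference for the notation, a dense NumPy sketch of the exact factorization $A = R^T R$ is given below (an illustration, not the paper's algorithm): the structured method described above produces an approximate upper triangular $R$ in HSS form at near-linear cost, whereas the plain dense factorization costs $O(N^3)$.

```python
import numpy as np

# Dense reference for the target factorization A = R^T R, R upper triangular.
# The paper computes an *approximate* R with HSS structure at near-linear
# cost; this O(N^3) dense version only illustrates the notation and the
# symmetric positive definite setting.
rng = np.random.default_rng(0)
B = rng.standard_normal((200, 200))
A = B @ B.T + 200 * np.eye(200)      # well-conditioned SPD test matrix

L = np.linalg.cholesky(A)            # lower triangular factor, A = L L^T
R = L.T                              # upper triangular factor, A = R^T R

print(np.allclose(A, R.T @ R))       # True: exact (dense) factorization
print(np.allclose(R, np.triu(R)))    # True: R is upper triangular
```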

Chordal Graphs and Semidefinite Optimization

Chordal graphs play a central role in techniques for exploiting sparsity in large semidefinite optimization problems and in related convex optimization problems involving sparse positive semidefinite matrices. Chordal graph properties are also fundamental to several classical results in combinatorial optimization, linear algebra, statistics, signal processing, machine learning, and nonlinear op...


Journal title: Multiscale Modeling & Simulation

Volume 15, Issue 1

Pages: -

Publication date: 2017